Kullback-Leibler Divergence and the Central Limit Theorem
Authors
Abstract
This paper investigates the asymptotics of the Kullback-Leibler divergence between two probability distributions satisfying a Central Limit Theorem property. The basic problem is as follows. Let Xi, i ∈ ℕ, be a sequence of independent random variables such that the sum Sn = X1 + … + Xn has the same expected value and satisfies the CLT under each of the two probability distributions. What, then, are the asymptotics of the KL divergence between the two distributions of Sn? Under regularity assumptions, we show that this KL divergence tends to a constant, which is explicitly identified, along with the rate of convergence. This result holds for several variations of the basic problem.
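The setting described in the abstract can be illustrated numerically. The sketch below is an assumption-laden toy example, not the paper's construction: it takes two mean-zero laws on the lattice {-1, 0, 1} with different variances, builds the pmf of Sn by repeated convolution, and compares D(Pn || Qn) with the KL divergence between the Gaussian limits, which for equal means and variances n·v_p and n·v_q does not depend on n.

```python
import numpy as np

# Hypothetical step distributions (both have mean 0, full lattice support):
p_step = np.array([1/3, 1/3, 1/3])    # under P: Var(X_i) = 2/3
q_step = np.array([0.25, 0.5, 0.25])  # under Q: Var(X_i) = 1/2

def kl(p, q):
    """KL divergence between two pmfs on the same support."""
    return float(np.sum(p * np.log(p / q)))

# pmfs of S_n on {-n, ..., n} via repeated convolution (n = 200 here)
p, q = p_step, q_step
for _ in range(199):
    p = np.convolve(p, p_step)
    q = np.convolve(q, q_step)

d_n = kl(p, q)

# KL between the Gaussian limits N(0, n*v_p) and N(0, n*v_q):
# 0.5*log(v_q/v_p) + v_p/(2*v_q) - 0.5, independent of n.
vp, vq = 2/3, 0.5
gauss_kl = 0.5*np.log(vq/vp) + vp/(2*vq) - 0.5
```

With these (illustrative) choices, d_n is close to the constant gauss_kl for large n, consistent with the kind of limit the abstract describes.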
Similar references
Model Confidence Set Based on Kullback-Leibler Divergence Distance
Consider the problem of estimating the true density h(.) based on a random sample X1, …, Xn. In general, h(.) is approximated by a model fθ(x) that is appropriate in some sense (see below). Using Vuong's (1989) test together with a collection of k (> 2) non-nested models, this article constructs a set of appropriate models, a so-called model confidence set, for the unknown model h(.). Application of such confide...
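Vuong's (1989) test compares two non-nested candidate models by asking which is closer to h(.) in Kullback-Leibler divergence, using the pointwise log-likelihood ratios. A minimal sketch for a single pair of fixed candidates (the data-generating law and both candidate densities here are hypothetical choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=2000)   # sample from the "true" density h

# Two non-nested candidate log-densities (illustrative, not fitted):
def log_f1(x):                        # N(0, 2)
    return -0.5*np.log(2*np.pi*2.0) - x**2 / 4.0

def log_f2(x):                        # Laplace(0, 1)
    return -np.log(2.0) - np.abs(x)

d = log_f1(x) - log_f2(x)             # pointwise log-likelihood ratios
z = np.sqrt(len(x)) * d.mean() / d.std(ddof=1)   # Vuong's statistic
# Under the null (both models equally close to h in KL divergence),
# z is asymptotically standard normal; a large |z| favours the model
# with the larger mean log-likelihood.
```

A model confidence set in the spirit of the article would retain every model not significantly beaten in such pairwise comparisons.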
The McMillan Theorem for Colored Branching Processes and Dimensions of Random Fractals
For the simplest colored branching process, we prove an analog to the McMillan theorem and calculate the Hausdorff dimensions of random fractals defined in terms of the limit behavior of empirical measures generated by finite genetic lines. In this setting, the role of Shannon’s entropy is played by the Kullback–Leibler divergence, and the Hausdorff dimensions are computed by means of the so-ca...
Berry–Esseen bounds in the entropic central limit theorem
Berry–Esseen-type bounds for total variation and relative entropy distances to the normal law are established for the sums of non-i.i.d. random variables.
Fisher information inequalities and the Central Limit Theorem
We give conditions for an O(1/n) rate of convergence of Fisher information and relative entropy in the Central Limit Theorem. We use the theory of projections in L2 spaces and Poincaré inequalities to provide a better understanding of the decrease in Fisher information implied by results of Barron and Brown. We show that if the standardized Fisher information ever becomes finite then it conver...
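The decrease of standardized Fisher information along the CLT can be observed numerically. The following sketch (an illustration under assumed parameters, not the cited paper's method) takes each X_i to be the Gaussian mixture 0.5·N(-1,1) + 0.5·N(1,1), so the density of S_n is an exact binomial mixture of normals, and computes J(T_n) = I(T_n) - 1 for the normalized sum T_n = S_n / sd(S_n) by quadrature:

```python
import math
import numpy as np

def standardized_fisher(n, a=1.0, half_width=10.0, dx=0.005):
    """J(T_n) = I(T_n) - 1 for T_n = S_n / sd(S_n), with X_i a
    symmetric two-component Gaussian mixture 0.5 N(-a,1) + 0.5 N(a,1)."""
    var_x = 1.0 + a*a                  # Var(X_i)
    s = 1.0 / math.sqrt(var_x)        # component sd after standardizing
    x = np.arange(-half_width, half_width, dx)
    f = np.zeros_like(x)              # density of T_n
    fp = np.zeros_like(x)             # its derivative
    for k in range(n + 1):
        w = math.comb(n, k) / 2.0**n  # binomial mixture weight
        mu = (2*k - n) * a / math.sqrt(n * var_x)
        phi = np.exp(-(x - mu)**2 / (2*s*s)) / (s * math.sqrt(2*math.pi))
        f += w * phi
        fp += w * (-(x - mu) / (s*s)) * phi
    fisher = np.sum(fp**2 / f) * dx   # I(T_n) = integral of (f')^2 / f
    return fisher - 1.0               # J = I - 1, since Var(T_n) = 1

j4, j8, j16 = (standardized_fisher(n) for n in (4, 8, 16))
```

J(T_n) stays nonnegative (the Gaussian minimizes Fisher information at fixed variance) and decreases along doublings of n, consistent with the O(1/n) rate discussed above.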